Optimal Bounds for Johnson-Lindenstrauss Transformations

Authors

  • Michael Burr
  • Shuhong Gao
  • Fiona Knoll
Abstract

In 1984, Johnson and Lindenstrauss proved that any finite set of data in a high-dimensional space can be projected to a lower-dimensional space while preserving the pairwise Euclidean distances between points up to a bounded relative error. If the desired dimension of the image is too small, however, Kane, Meka, and Nelson (2011) and Jayram and Woodruff (2013) independently proved that no such projection exists. In this paper, we provide a precise asymptotic threshold for the dimension of the image: above the threshold a distance-preserving projection exists, and below it no such projection exists.
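As a concrete illustration of the kind of projection in question, a Gaussian random projection can be sketched as follows. This is only a sketch: the Gaussian construction and the constant 8 in the target dimension are standard textbook choices, not the sharp threshold established in this paper.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

n, d, eps = 50, 1000, 0.5
k = int(np.ceil(8 * np.log(n) / eps**2))      # classical JL target dimension

X = rng.standard_normal((n, d))               # n data points in R^d
A = rng.standard_normal((k, d)) / np.sqrt(k)  # scaled Gaussian projection matrix
Y = X @ A.T                                   # images in R^k

# Fraction of point pairs whose distance is preserved up to relative error eps.
ratios = [np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])
          for i, j in combinations(range(n), 2)]
preserved = sum(1 - eps <= r <= 1 + eps for r in ratios) / len(ratios)
print(preserved)
```

With these parameters, essentially all pairwise distances land within the 1 ± ε band, even though the image dimension k is far below d.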


Similar Resources

The Johnson-Lindenstrauss Lemma Is Optimal for Linear Dimensionality Reduction

For any n > 1 and 0 < ε < 1/2, we show the existence of an n-point subset X of Rⁿ such that any linear map from (X, ℓ₂) to ℓ₂ᵐ with distortion at most 1 + ε must have m = Ω(min{n, ε⁻² log n}). Our lower bound matches the upper bounds provided by the identity matrix and the Johnson-Lindenstrauss lemma [JL84], improving the previous lower bound of Alon [Alo03] by a log(1/ε) factor.


Almost Optimal Explicit Johnson-Lindenstrauss Transformations

The Johnson-Lindenstrauss lemma is a fundamental result in probability with several applications in the design and analysis of algorithms in high dimensional geometry. Most known constructions of linear embeddings that satisfy the Johnson-Lindenstrauss property involve randomness. We address the question of explicitly constructing such embedding families and provide a construction with an almos...


The Fast Johnson-Lindenstrauss Transform

While we omit the proof, we remark that it is constructive. Specifically, A is a linear map consisting of random projections onto subspaces of Rd. These projections can be computed by n matrix multiplications, which take time O(nkd). This is fast enough to make the Johnson-Lindenstrauss transform (JLT) a practical and widespread algorithm for dimensionality reduction, which in turn motivates th...
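A minimal sketch of one standard fast-JLT-style construction, the subsampled randomized Hadamard transform, is shown below. The dimensions and the power-of-two restriction are illustrative assumptions, not details taken from this excerpt.

```python
import numpy as np

rng = np.random.default_rng(1)
d, k = 256, 64   # d must be a power of two for the Hadamard transform

def fwht(v):
    """Unnormalized fast Walsh-Hadamard transform in O(d log d)."""
    v = v.astype(float).copy()
    h = 1
    while h < len(v):
        for i in range(0, len(v), 2 * h):
            x, y = v[i:i + h].copy(), v[i + h:i + 2 * h].copy()
            v[i:i + h], v[i + h:i + 2 * h] = x + y, x - y
        h *= 2
    return v

signs = rng.choice([-1.0, 1.0], d)        # random diagonal sign matrix D
coords = rng.choice(d, k, replace=False)  # random subsample of k coordinates

def srht(x):
    # Apply (subsample) . (H / sqrt(d)) . D in O(d log d); the scaling makes
    # the embedded squared norm equal to ||x||^2 in expectation.
    return np.sqrt(d / k) * fwht(signs * x)[coords] / np.sqrt(d)

x = rng.standard_normal(d)
ratio = np.linalg.norm(srht(x)) / np.linalg.norm(x)
print(ratio)
```

The transform here costs O(d log d) per point instead of the O(kd) of a dense matrix-vector product, which is the speedup the excerpt alludes to.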


New bounds for circulant Johnson-Lindenstrauss embeddings

This paper analyzes circulant Johnson-Lindenstrauss (JL) embeddings which, as an important class of structured random JL embeddings, are formed by randomizing the column signs of a circulant matrix generated by a random vector. With the help of recent decoupling techniques and matrix-valued Bernstein inequalities, we obtain a new bound k = O(ε⁻² log(n)) for Gaussian circulant JL embeddings. Moreo...
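A minimal sketch of such a circulant embedding, assuming a Gaussian generating vector and FFT-based multiplication (the dimensions d = 512 and k = 64 are illustrative; the paper's precise scaling of k is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)
d, k = 512, 64

g = rng.standard_normal(d)          # random vector generating the circulant matrix
signs = rng.choice([-1.0, 1.0], d)  # random column-sign flips

def circ_jl(x):
    # Multiplying by the circulant matrix of g is circular convolution,
    # computed in O(d log d) via the FFT; keep the first k rows and rescale.
    conv = np.fft.ifft(np.fft.fft(g) * np.fft.fft(signs * x)).real
    return conv[:k] / np.sqrt(k)

x = rng.standard_normal(d)
ratio = np.linalg.norm(circ_jl(x)) / np.linalg.norm(x)
print(ratio)
```

Because only the random vector g and the sign pattern need to be stored, the embedding uses O(d) space rather than the O(kd) of a dense random matrix.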


A Summary of Some Interesting Bounds in Estimation and Learning

These review notes serve as a guide for myself to some bounds of interest in estimation theory and learning theory, including the Cramér-Rao bound (CRB), concentration inequalities, Vapnik-Chervonenkis (VC) theory, probably approximately correct (PAC) learning, and the Johnson-Lindenstrauss (JL) lemma.




Publication year: 2018